Slice and Dice: A Physicalization Workflow for Anatomical Edutainment
During the last decades, anatomy has become an interesting topic in
education---even for laymen or schoolchildren. As medical imaging techniques
become increasingly sophisticated, virtual anatomical education applications
have emerged. Still, physical anatomical models are often preferred, as they
facilitate the 3D localization of anatomical structures. Recently, data physicalizations
(i.e., physical visualizations) have proven to be effective and
engaging---sometimes, even more than their virtual counterparts. So far,
medical data physicalizations involve mainly 3D printing, which is still
expensive and cumbersome. We investigate alternative forms of physicalizations,
which use readily available technologies (home printers) and inexpensive
materials (paper or semi-transparent films) to generate crafts for anatomical
edutainment. To the best of our knowledge, this is the first computer-generated
crafting approach within an anatomical edutainment context. Our approach
follows a cost-effective, simple, and easy-to-employ workflow, resulting in
assemblable data sculptures (i.e., semi-transparent sliceforms). It primarily
supports volumetric data (such as CT or MRI), but mesh data can also be
imported. An octree slices the imported volume and an optimization step
simplifies the slice configuration, proposing the optimal order for easy
assembly. A packing algorithm places the resulting slices with their labels,
annotations, and assembly instructions on a paper or transparent film of
user-selected size, to be printed, assembled into a sliceform, and explored. We
conducted two user studies to assess our approach, demonstrating that it is an
initial positive step towards the successful creation of interactive and
engaging anatomical physicalizations.
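The slicing and ordering steps described above can be sketched roughly as follows. This is a minimal Python sketch: the function names are hypothetical, and the fixed-interval slicing and middle-out heuristic merely stand in for the paper's octree slicing and optimization step.

```python
import numpy as np

def slice_volume(volume, axis=2, step=4):
    """Cut a 3D scalar volume (e.g., CT or MRI) into evenly spaced 2D slices.

    Hypothetical simplification: the paper slices the volume via an octree;
    here we just take axis-aligned slices at a fixed interval.
    """
    return [np.take(volume, i, axis=axis)
            for i in range(0, volume.shape[axis], step)]

def assembly_order(n_slices):
    """Propose an assembly order from the middle slice outward, so the
    sliceform stays balanced while being put together (an illustrative
    heuristic, not the paper's optimization step)."""
    mid = n_slices // 2
    order = [mid]
    for offset in range(1, n_slices):
        for idx in (mid - offset, mid + offset):
            if 0 <= idx < n_slices:
                order.append(idx)
    return order

# Tiny synthetic volume: 16^3 voxels, sliced every 4 voxels along z
vol = np.zeros((16, 16, 16))
slices = slice_volume(vol)            # 4 slices of shape (16, 16)
order = assembly_order(len(slices))   # middle-out: [2, 1, 3, 0]
```

In a full pipeline, each slice would then be packed onto a printable page together with its label and assembly instructions.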
ScaleTrotter: Illustrative Visual Travels Across Negative Scales
We present ScaleTrotter, a conceptual framework for an interactive,
multi-scale visualization of biological mesoscale data and, specifically,
genome data. ScaleTrotter allows viewers to smoothly transition from the
nucleus of a cell to the atomistic composition of the DNA, while bridging
several orders of magnitude in scale. The challenges in creating an interactive
visualization of genome data differ in several fundamental ways from those in
other domains, such as astronomy, that also require multi-scale
representations. First, genome data has intertwined scale levels---the DNA is an
extremely long, connected molecule that manifests itself at all scale levels.
Second, elements of the DNA do not disappear as one zooms out---instead the
scale levels at which they are observed group these elements differently.
Third, we have detailed information and thus geometry for the entire dataset
and for all scale levels, posing a challenge for interactive visual
exploration. Finally, the conceptual scale levels for genome data are close in
scale space, requiring us to find ways to visually embed a smaller scale into a
coarser one. We address these challenges by creating a new multi-scale
visualization concept. We use a scale-dependent camera model that controls the
visual embedding of the scales into their respective parents, the rendering of
a subset of the scale hierarchy, and the location, size, and scope of the view.
In traversing the scales, ScaleTrotter roams between 2D and 3D visual
representations that are depicted in integrated visuals. We discuss,
specifically, how this form of multi-scale visualization follows from the
specific characteristics of the genome data and describe its implementation.
Finally, we discuss the implications of our work for the general illustrative
depiction of multi-scale data.
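The idea of smoothly bridging several orders of magnitude can be illustrated with log-space interpolation. This is a hedged Python sketch, not the paper's scale-dependent camera model: the level labels, distances, and function names are assumptions.

```python
import math

# Conceptual scale levels of the genome, coarse to fine (illustrative labels)
LEVELS = ("nucleus", "chromosome", "nucleosome", "double helix", "atoms")

def camera_distance(t, d_coarse=1e-5, d_fine=1e-9):
    """Camera distance for zoom parameter t in [0, 1], interpolated in log
    space so the perceived zoom speed stays constant while bridging several
    orders of magnitude (the distances, in metres, are illustrative)."""
    log_d = (1 - t) * math.log10(d_coarse) + t * math.log10(d_fine)
    return 10.0 ** log_d

def embedded_levels(t):
    """The pair of adjacent scale levels blended at parameter t: the coarser
    parent, the finer child visually embedded into it, and the blend weight."""
    pos = t * (len(LEVELS) - 1)
    lo = min(int(pos), len(LEVELS) - 2)
    return LEVELS[lo], LEVELS[lo + 1], pos - lo
```

Because the conceptual levels are close in scale space, any real traversal would also blend geometry and 2D/3D representations across the pair returned by `embedded_levels`, not just the camera distance.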
Hybrid visibility compositing and masking for illustrative rendering
In this paper, we introduce a novel framework for the compositing of interactively rendered 3D layers tailored to the needs of scientific illustration. Currently, traditional scientific illustrations are produced in a series of composition stages, combining different pictorial elements using 2D digital layering. Our approach extends the layer metaphor into 3D without giving up the advantages of 2D methods. The new compositing approach allows for effects such as selective transparency, occlusion overrides, and soft depth buffering. Furthermore, we show how common manipulation techniques such as masking can be integrated into this concept. These tools behave just like in 2D, but their influence extends beyond a single viewpoint. Since the presented approach makes no assumptions about the underlying rendering algorithms, layers can be generated based on polygonal geometry, volumetric data, point-based representations, or others. Our implementation exploits current graphics hardware and permits real-time interaction and rendering.
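The soft depth buffering effect can be sketched as a per-pixel cross-fade between layers. This is a minimal NumPy sketch under stated assumptions: the function and parameter names are hypothetical, and in the paper's framework the layers come from hardware-accelerated renderers rather than arrays.

```python
import numpy as np

def soft_depth_blend(color_a, depth_a, color_b, depth_b, softness=0.1):
    """Composite two rendered layers per pixel with a soft depth test.

    Where the depth gap exceeds `softness`, the nearer layer wins outright;
    inside the band, the layers cross-fade, avoiding the hard silhouettes of
    a binary z-test. An illustrative sketch of the soft-depth-buffering idea
    only; layer contents and the softness value are hypothetical.
    """
    # w -> 1 when layer A is clearly nearer, 0 when B is, 0.5 at equal depth
    d = np.clip((depth_b - depth_a) / softness, -1.0, 1.0)
    w = 0.5 * (d + 1.0)
    return w[..., None] * color_a + (1.0 - w[..., None]) * color_b

# One-pixel example: a red layer well in front of a blue one keeps its color
red  = np.array([[[1.0, 0.0, 0.0]]])
blue = np.array([[[0.0, 0.0, 1.0]]])
out = soft_depth_blend(red, np.array([[0.2]]), blue, np.array([[0.8]]))
```

A hard z-test is the limit `softness -> 0`; masks could enter the same formula as an extra per-pixel weight on `w`.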